Fix Deprecation Warnings torch.GradScaler and torch.autocast #930
Conversation
…`torch.autocast("cuda", args...)` - replace all occurrences of `torch.cuda.amp.GradScaler(args...)` with `torch.GradScaler("cuda", args...)`
@IliasChair thanks for helping us clean this up! This should be fine. I would prefer to get the device type instead of blanket-casting to "cuda" (there are tests that rely on CPU autocast), but if the tests pass and all the old code was using CUDA autocast anyway, it should be fine.
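A minimal sketch of what device-aware autocasting could look like, assuming a hypothetical `autocast_for` helper (not part of this PR) that derives the device type from a `torch.device` instead of hard-coding `"cuda"`:

```python
import torch

def autocast_for(device: torch.device, enabled: bool = True):
    # Hypothetical helper: pick the autocast device type from the target
    # device so CPU-only runs (and tests) still exercise autocast.
    device_type = device.type  # "cuda" or "cpu"
    dtype = torch.float16 if device_type == "cuda" else torch.bfloat16
    return torch.autocast(device_type, dtype=dtype, enabled=enabled)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(16, 4).to(device)
x = torch.randn(2, 16, device=device)

with autocast_for(device):
    out = model(x)
```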
Hi, if adding support for CPU autocasting is a planned feature, I believe it would still make sense to merge these changes first and then create a separate issue for that later. As I see it, the codebase is largely tailored for GPU usage, particularly for the more intensive tasks. Adding full CPU support would likely be a larger endeavour, and it may not be worth the effort since users are going to use GPUs for anything machine-learning related anyway. That said, if I missed anything, I'd be happy to update my PR accordingly. Best regards,
Sorry, forgot to approve this! Will merge once tests pass!
…`torch.autocast("cuda", args...)` (#930) - replace all occurrences of `torch.cuda.amp.GradScaler(args...)` with `torch.GradScaler("cuda", args...)` Co-authored-by: iliaschair <[email protected]> Co-authored-by: Ray <[email protected]> Former-commit-id: 9de1edb636dff0a5ca640658ee10975e91aef7df
…`torch.autocast("cuda", args...)` (#930) - replace all occurrences of `torch.cuda.amp.GradScaler(args...)` with `torch.GradScaler("cuda", args...)` Co-authored-by: iliaschair <[email protected]> Co-authored-by: Ray <[email protected]> Former-commit-id: fe6bb3248d782c629e539164d5bc7f69b9ee37d0
This PR replaces all occurrences of `torch.cuda.amp.GradScaler(args...)` and `torch.cuda.amp.autocast(args...)` with `torch.GradScaler("cuda", args...)` and `torch.autocast("cuda", args...)`, respectively. This fixes the deprecation warnings emitted during training and inference (see the PyTorch docs).
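For illustration, a minimal before/after sketch of the substitution, using the spellings adopted in this PR and assuming a CUDA device is available:

```python
import torch

model = torch.nn.Linear(16, 4).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

scaler = torch.GradScaler("cuda")      # was: torch.cuda.amp.GradScaler()
x = torch.randn(8, 16, device="cuda")

with torch.autocast("cuda"):           # was: torch.cuda.amp.autocast()
    loss = model(x).sum()

scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```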
P.S. I'm not sure about the difference between `torch.autocast("cuda", args...)` and `torch.amp.autocast("cuda", args...)` (as seen in the deprecation warning), but I would prefer to stick to the documentation, e.g. only using `torch.autocast` without the `amp` prefix.
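As a rough check, a small sketch of my own (not from this PR) exercising both spellings; whether they behave identically in all cases is an assumption, but both are documented context managers that take a `device_type` string:

```python
import torch

x = torch.randn(8, 8)
w = torch.randn(8, 8)

# CPU autocast is used here only so the snippet runs without a GPU.
with torch.autocast("cpu", dtype=torch.bfloat16):
    y1 = x @ w

with torch.amp.autocast("cpu", dtype=torch.bfloat16):
    y2 = x @ w

print(y1.dtype, y2.dtype)  # both should report torch.bfloat16
```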